Communication-Efficient and Byzantine-Robust Distributed Learning With Error Feedback
Abstract
We develop a communication-efficient distributed learning algorithm that is robust against Byzantine worker machines. We propose and analyze a distributed gradient-descent algorithm that performs simple thresholding based on gradient norms to mitigate Byzantine failures. We show that the (statistical) error rate of our algorithm matches that of Yin et al. (2018), which uses more complicated schemes (coordinate-wise median, trimmed mean). Furthermore, for communication efficiency, we consider a generic class of $\delta$-approximate compressors from Karimireddy et al. (2019) that encompasses sign-based compressors and top-$k$ sparsification. Our algorithm uses compressed gradients and gradient norms for aggregation and Byzantine removal, respectively. We establish the statistical error rate for non-convex smooth loss functions. We show that, in a certain range of the compression factor $\delta$, the (order-wise) rate of convergence is not affected by the compression operation. Moreover, we analyze compressed gradient descent with error feedback (proposed by Karimireddy et al., 2019) in a distributed setting in the presence of Byzantine worker machines, and show that exploiting error feedback improves the statistical error rate. Finally, we experimentally validate our results, showing good convergence performance on both convex (least-square regression) and non-convex (neural network training) problems.
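To make the recipe concrete, the following is a minimal NumPy sketch, not the authors' reference implementation: all function names and the trim_fraction parameter are illustrative. It combines the three ingredients from the abstract in one round: top-$k$ sparsification as a $\delta$-approximate compressor (with $\delta = k/d$), norm-based thresholding for Byzantine removal, and per-worker error feedback.

import numpy as np

def top_k_compressor(v, k):
    # Top-k sparsification: keep the k largest-magnitude coordinates and
    # zero the rest. It is a delta-approximate compressor with delta = k/d,
    # since ||C(v) - v||^2 <= (1 - k/d) * ||v||^2 for v in R^d.
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def robust_aggregate(grads, norms, trim_fraction):
    # Norm-based thresholding: drop the trim_fraction of workers reporting
    # the largest gradient norms, then average the surviving gradients.
    m = len(grads)
    keep = np.argsort(norms)[: m - int(trim_fraction * m)]
    return np.mean([grads[i] for i in keep], axis=0)

def distributed_step(w, worker_grad_fns, errors, lr, k, trim_fraction):
    # One round: each worker compresses (gradient + accumulated residual),
    # stores the new residual (error feedback), and reports the compressed
    # gradient together with its norm; the server filters and averages.
    compressed, norms = [], []
    for i, grad_fn in enumerate(worker_grad_fns):
        g = grad_fn(w) + errors[i]   # re-inject past compression error
        c = top_k_compressor(g, k)
        errors[i] = g - c            # residual kept locally for next round
        compressed.append(c)
        norms.append(np.linalg.norm(c))
    return w - lr * robust_aggregate(compressed, norms, trim_fraction)

Here errors would be initialized to one zero vector per worker; a Byzantine worker may report an arbitrary vector, and the norm filter is designed to discard such reports when their norms are outliers. Re-adding the stored residual before the next compression is what lets error feedback recover, over time, the information the compressor discards.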
Similar resources
Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates
In large-scale distributed learning, security issues have become increasingly important. Particularly in a decentralized environment, some computing units may behave abnormally, or even exhibit Byzantine failures—arbitrary and potentially adversarial behavior. In this paper, we develop distributed learning algorithms that are provably robust against such failures, with a focus on achieving opti...
New Efficient Error-Free Multi-Valued Consensus with Byzantine Failures
Any opinions, findings, and conclusions or recommendations expressed here are those of the authors and do not necessarily reflect the views of the funding agencies or the U.S. government.
Communication-Efficient Distributed Learning of Discrete Distributions
We initiate a systematic investigation of distribution learning (density estimation) when the data is distributed across multiple servers. The servers must communicate with a referee and the goal is to estimate the underlying distribution with as few bits of communication as possible. We focus on non-parametric density estimation of discrete distributions with respect to the $\ell_1$ and $\ell_2$ norms. We...
General and Robust Communication-Efficient Algorithms for Distributed Clustering
As datasets become larger and more distributed, algorithms for distributed clustering have become more and more important. In this work, we present a general framework for designing distributed clustering algorithms that are robust to outliers. Using our framework, we give a distributed approximation algorithm for k-means, k-median, or generally any $\ell_p$ objective, with $z$ outliers and/or balance ...
Communication Efficient Distributed Machine Learning with the Parameter Server
This paper describes a third-generation parameter server framework for distributed machine learning. This framework offers two relaxations to balance system performance and algorithm efficiency. We propose a new algorithm that takes advantage of this framework to solve non-convex non-smooth problems with convergence guarantees. We present an in-depth analysis of two large scale machine learning...
Journal
Journal title: IEEE Journal on Selected Areas in Information Theory
Year: 2021
ISSN: 2641-8770
DOI: https://doi.org/10.1109/jsait.2021.3105076